243 research outputs found

    Storytelling in the Metaverse: From Desktop to Immersive Virtual Reality Storyboarding

    Creatives from the animation and film industries have always experimented with innovative tools and methodologies to improve the prototyping of their visual sequences before bringing them to life. In recent years, with the emergence of realistic real-time rendering techniques, the increasing popularity of virtual reality (VR) has opened the way to new approaches and solutions that leverage the immersive and interactive features of 3D experiences. A 3D desktop application and a novel storyboarding pipeline, which can automatically generate a storyboard including camera details and a textual description of the actions performed in three-dimensional environments, were investigated in previous work. The aim was to exploit new technologies to improve existing 3D storytelling approaches, thus providing a software solution for both expert and novice storyboarders. This research investigates 3D storyboarding in immersive virtual reality (IVR) to move toward a new storyboarding paradigm. IVR systems provide peculiarities such as body-controlled exploration of the 3D scene and a head-dependent camera view that can extend the features of traditional storyboarding tools. The proposed system enables users to set up the virtual stage, adding elements to the scene and exploring the environment as they build it. Users can then select the available characters or the camera, control them in first person, position them in the scene, and perform actions by selecting from a list of options, each paired with a corresponding animation. Relying on the concept of a state machine, the system automatically generates the list of available actions depending on the context. Finally, the description for each storyboard panel is generated automatically from the history of performed activities. The proposed application maintains all the functionalities of the desktop version and can be effectively used to create storyboards in immersive virtual environments.
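    Abstractly, the context-dependent action list described in the abstract can be modeled with a small state machine. The sketch below is a hypothetical illustration only: the states, action names, and transitions are invented for this example and are not the paper's actual implementation.

```python
from enum import Enum, auto

class CharState(Enum):
    IDLE = auto()
    SITTING = auto()
    HOLDING_PROP = auto()

# Hypothetical transition table: each state maps the actions available
# in that context to the state the character ends up in.
TRANSITIONS = {
    CharState.IDLE: {"sit": CharState.SITTING,
                     "pick_up": CharState.HOLDING_PROP,
                     "walk": CharState.IDLE},
    CharState.SITTING: {"stand": CharState.IDLE},
    CharState.HOLDING_PROP: {"drop": CharState.IDLE,
                             "throw": CharState.IDLE},
}

def available_actions(state):
    """Return the context-dependent list of actions offered to the user."""
    return sorted(TRANSITIONS[state])

def perform(state, action, history):
    """Apply an action and record it, so panel captions can later be
    generated from the history of performed activities."""
    next_state = TRANSITIONS[state][action]
    history.append(action)
    return next_state

history = []
s = perform(CharState.IDLE, "pick_up", history)
print(available_actions(s))  # -> ['drop', 'throw']
```

    The same history list that drives the transitions doubles as the input for automatic caption generation, mirroring the pipeline the abstract describes.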

    Augmented Reality in Industry 4.0

    Since the origins of Augmented Reality (AR), industry has always been one of its prominent application domains. Recent advances in both portable and wearable AR devices, together with the new challenges introduced by the fourth industrial revolution (known as Industry 4.0), further broaden the applicability of AR to improve productivity and enhance the user experience. This paper provides an overview of the most important applications of AR in the industrial domain. Key among the issues raised in this paper are the various applications of AR that enhance the user's ability to understand the movement of a mobile robot, the movements of a robot arm, and the forces applied by a robot. It is recommended that, in view of the rising need for both user and data privacy, the technologies that form the basis of Industry 4.0 will need to change the way they work to embrace data privacy.

    A Comparison of Three Different NeuroTag Visualization Media: Brain Visual Stimuli by Monitor, Augmented and Virtual Reality Devices

    Brain-Computer Interfaces (BCIs) have proved to overcome some limitations of other input modes (e.g., gestures, voice, haptics). BCIs detect brain activity and identify sought patterns in it. When a specific brain activity is recognized, a well-defined action can be triggered, thus implementing a human-machine interaction paradigm. BCIs can be used in domains ranging from industry to services for impaired people. This paper considers BCIs that can be designed and developed with the NextMind, a small, ergonomic device that captures the activity of the visual cortex. Objects called NeuroTags can be inserted in both 2D and 3D scenes; these objects act like switches when the user focuses on them. The aim of this work is to evaluate different NeuroTag configurations (varying in size and spacing) as well as different visualization devices: a monitor, a virtual reality head-mounted display, and an augmented reality head-mounted display. User tests show that the best tradeoff between robustness and selection speed is obtained by medium-size, medium-spaced NeuroTags; moreover, monitor visualization outperforms the AR solution, whereas no statistically significant differences emerge between monitor-VR and AR-VR.

    Harmonize: a shared environment for extended immersive entertainment

    Virtual reality (VR) and augmented reality (AR) applications are widespread nowadays. Moreover, recent technological innovations have led to the diffusion of commercial head-mounted displays (HMDs) for immersive VR: users can enjoy entertainment activities that fill their visual fields, experiencing the sensation of physical presence in these virtual immersive environments (IEs). Even if AR and VR are mostly used separately, they can be effectively combined to provide a multi-user shared environment (SE), where two or more users perform specific tasks in a cooperative or competitive way, providing a wider set of interactions and use cases than immersive VR alone. However, due to the differences between the two technologies, it is difficult to develop SEs offering a similar experience to both AR and VR users. This paper presents Harmonize, a novel framework to deploy applications based on SEs with a comparable experience for both AR and VR users. Moreover, the framework is hardware-independent and has been designed to be as extensible to novel hardware as possible. An immersive game has been designed to test and evaluate the validity of the proposed framework. The assessment of the system through the System Usability Scale (SUS) questionnaire and the Game Experience Questionnaire (GEQ) shows a positive evaluation.
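    For readers unfamiliar with the SUS questionnaire used in the evaluation, its standard scoring rule (general to SUS, not specific to this paper) maps ten 1-5 Likert responses to a 0-100 score: odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is multiplied by 2.5.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    Likert responses on a 1-5 scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible: 100.0
```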

    3D Scene Reconstruction System Based on a Mobile Device

    Augmented reality (AR) and virtual reality (VR) applications can take advantage of the efficient digitalization of real objects, as reconstructed elements allow users a better connection between real and virtual worlds than pre-set 3D CAD models. Technological advances contribute to the spread of AR and VR, which are becoming ever more widespread and popular. On the other hand, the design and implementation of virtual and extended worlds is still an open problem; affordable and robust solutions to support 3D object digitalization are still missing. This work proposes a reconstruction system that provides users with a 3D CAD model starting from a single image of the object to be digitalized and reconstructed. A smartphone can be used to take a photo of the object under analysis, and a remote server performs the reconstruction process by exploiting a pipeline of three Deep Learning methods. The accuracy and robustness of the system have been assessed in several experiments, and the main outcomes show that the proposed solution has accuracy (measured by chamfer distance) comparable to state-of-the-art methods for 3D object reconstruction.
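    The chamfer distance used as the accuracy metric compares two point sets by matching each point to its nearest neighbour in the other set. The brute-force sketch below is illustrative only; the paper does not specify its implementation, and practical systems use spatial indexing instead of the O(n*m) search shown here.

```python
def chamfer_distance(a, b):
    """Symmetric chamfer distance between two point sets: for each point
    in one set, take the squared distance to its nearest neighbour in the
    other set, average over the set, and sum both directions."""
    def one_way(src, dst):
        return sum(
            min(sum((p - q) ** 2 for p, q in zip(s, d)) for d in dst)
            for s in src
        ) / len(src)
    return one_way(a, b) + one_way(b, a)

cube_corner = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(chamfer_distance(cube_corner, cube_corner))  # identical sets -> 0.0
```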

    An Evaluation of Game Usability in Shared Mixed and Virtual Environments

    Augmented reality (AR) and virtual reality (VR) technologies are becoming increasingly pervasive and important in the information technology area. Thanks to technological improvements, desktop interfaces are being replaced by immersive VR devices that offer a more compelling game experience. AR games have started to draw the attention of researchers and companies for their ability to exploit both real and virtual environments. Fascinating new challenges arise from the possibility of designing hybrid games that allow several users to access shared environments by exploiting the features of both AR and VR devices. However, user experience and usability can be affected by several parameters, such as the field of view (FoV) of the employed devices or the realism of the scene. The work presented in this chapter aims to assess the impact of the FoV on the usability of the interfaces in a first-person shooter game. Two players, interacting with AR (first player) and VR (second player) devices, can fight each other in a large game environment. Although we cannot ascertain that different FoVs affected the game usability, users considered the narrow-FoV interfaces to be less usable, even though they could freely move around the real environment.

    Heavy-flavor transport and hadronization in pp collisions

    Recent experimental results on the Lambda_c/D^0 ratio in proton-proton collisions have revealed a significant enhancement compared to expectations based on universal fragmentation fractions/functions across different colliding systems, from e+e- to pp. This unexpected enhancement has sparked speculation about the potential effects of a deconfined medium on hadronization, previously considered exclusive to heavy-ion collisions. In this study, we propose a novel approach that assumes the formation of a small, deconfined, and expanding fireball even in pp collisions, in which charm quarks can undergo rescattering and hadronization. We make use of the same in-medium hadronization mechanism developed for heavy-ion collisions, which involves local color-neutralization through recombination of charm quarks with nearby opposite color charges from the background fireball. Our model incorporates the presence of diquark excitations in the hot medium, which promotes the formation of charmed baryons. Moreover, the recombination process, involving closely aligned partons from the same fluid cell, effectively transfers the collective flow of the system to the final charmed hadrons. We show that this framework can qualitatively reproduce the observed experimental findings in heavy-flavor particle-yield ratios, p_T spectra, and elliptic-flow coefficients. Our results provide new, complementary supporting evidence that the collective phenomena observed in small systems naturally have the same origin as those observed in heavy-ion collisions.
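    For reference, the elliptic-flow coefficient mentioned above is the standard second harmonic of the azimuthal particle distribution (a textbook definition, not a result of this paper):

```latex
\frac{dN}{d\phi} \propto 1 + 2\sum_{n=1}^{\infty} v_n \cos\!\big[n(\phi - \Psi_n)\big],
\qquad v_2 = \big\langle \cos 2(\phi - \Psi_2) \big\rangle
```

    Here \phi is the particle's azimuthal angle and \Psi_n the n-th harmonic symmetry-plane angle; a nonzero v_2 for charmed hadrons is the collective-flow signal the model aims to reproduce.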